multiscale representation
First provide a summary of the paper, and then address the following criteria: quality, clarity, originality and significance.

The paper proposes to extend the Fields of Experts model to a multiscale image representation, using a combination of a binary latent variable model and a gray-level image model. The model is applied to images of contours embedded in noise and to black-and-white drawings embedded in noise, and the results show that the model can do a reasonable job of recovering the underlying binary process. The overall idea here is useful and interesting: leveraging a multiscale model to capture long-range in addition to short-range structure in images. Such an approach is long overdue in image models, which are often applied only to local patches at the finest-grain pixel level.
Multiscale Fields of Patterns
Pedro Felzenszwalb, John G. Oberlin
We describe a framework for defining high-order image models that can be used in a variety of applications. The approach involves modeling local patterns in a multiscale representation of an image. Local properties of a coarsened image reflect non-local properties of the original image. In the case of binary images local properties are defined by the binary patterns observed over small neighborhoods around each pixel. With the multiscale representation we capture the frequency of patterns observed at different scales of resolution. This framework leads to expressive priors that depend on a relatively small number of parameters. For inference and learning we use an MCMC method for block sampling with very large blocks. We evaluate the approach with two example applications. One involves contour detection. The other involves binary segmentation.
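The representation the abstract describes can be sketched concretely: repeatedly coarsen a binary image and tally the small binary patterns at each level. This is only an illustrative sketch; the paper's actual coarsening rule and pattern definitions may differ (majority-vote 2x2 coarsening and 3x3 patterns are assumptions here).

```python
import numpy as np

def coarsen(img):
    """Coarsen a binary image by 2x2 majority vote (one plausible choice;
    the paper's exact coarsening rule may differ)."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    blocks = img[:h, :w].reshape(h // 2, 2, w // 2, 2)
    return (blocks.sum(axis=(1, 3)) >= 2).astype(np.uint8)

def pattern_counts(img, k=3):
    """Count the frequency of each k x k binary pattern in an image."""
    counts = {}
    for i in range(img.shape[0] - k + 1):
        for j in range(img.shape[1] - k + 1):
            key = tuple(img[i:i + k, j:j + k].ravel())
            counts[key] = counts.get(key, 0) + 1
    return counts

def multiscale_counts(img, levels=3):
    """Pattern statistics at each level of a multiscale pyramid."""
    stats = []
    for _ in range(levels):
        stats.append(pattern_counts(img))
        img = coarsen(img)
    return stats
```

Because a 3x3 pattern at a coarse level summarizes a much larger window of the original image, pattern frequencies at coarse levels are exactly the kind of non-local property the abstract refers to.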
Model-free Estimation of Latent Structure via Multiscale Nonparametric Maximum Likelihood
Multivariate distributions often carry latent structures that are difficult to identify and estimate, and which better reflect the data generating mechanism than extrinsic structures exhibited simply by the raw data. In this paper, we propose a model-free approach for estimating such latent structures whenever they are present, without assuming they exist a priori. Given an arbitrary density $p_0$, we construct a multiscale representation of the density and propose data-driven methods for selecting representative models that capture meaningful discrete structure. Our approach uses a nonparametric maximum likelihood estimator to estimate the latent structure at different scales and we further characterize their asymptotic limits. By carrying out such a multiscale analysis, we obtain coarse-to-fine structures inherent in the original distribution, which are integrated via a model selection procedure to yield an interpretable discrete representation of it. As an application, we design a clustering algorithm based on the proposed procedure and demonstrate its effectiveness in capturing a wide range of latent structures.
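The coarse-to-fine idea can be illustrated with a toy analogue (this is not the paper's NPMLE machinery, just a simple stand-in): kernel density estimates at two bandwidths, where a large bandwidth merges two clusters into a single mode and a small bandwidth resolves them. All parameters below are illustrative.

```python
import numpy as np

def kde(samples, grid, bandwidth):
    """Simple 1-D Gaussian kernel density estimate."""
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(samples) * bandwidth * np.sqrt(2 * np.pi))

def count_modes(density):
    """Number of strict interior local maxima of a sampled density."""
    mid = density[1:-1]
    return int(np.sum((mid > density[:-2]) & (mid > density[2:])))

rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-3, 0.5, 200), rng.normal(3, 0.5, 200)])
grid = np.linspace(-6, 6, 400)
coarse = count_modes(kde(samples, grid, bandwidth=4.0))  # large bandwidth merges the clusters
fine = count_modes(kde(samples, grid, bandwidth=0.4))    # small bandwidth separates them
```

Scanning the bandwidth from coarse to fine traces out a hierarchy of discrete structures, which is the intuition behind selecting representative models across scales.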
Multiscale representation of very large environments in the hippocampus of flying bats
Nearly all mammals navigate over large spatial scales in environments that span hundreds of meters to many kilometers. However, very little is known about the neural representations that underlie the coding of such large spaces. Eliav et al. recorded from place cells in the hippocampus of bats as they flew back and forth on an extremely long track (see the Perspective by Wood and Dudchenko). Many place cells had multiple place fields within this large environment. The place field sizes ranged from less than 1 meter up to 32 meters, and the sizes of the different place fields of an individual cell varied as much as 20-fold. Studying animals under naturalistic conditions can reveal new coding principles for the representation of their environment in the brain. Science, abg4020, this issue p. [eabg4020][1]; see also abi9663, p. [913][2]

### INTRODUCTION

Place cells are neurons in the hippocampus that represent the animal’s position in space and are important for supporting navigation behaviors. These cells increase their spiking activity when the animal passes through a specific region of space, called the neuron’s “place field.” Since the discovery of place cells half a century ago, nearly all the research on spatial representations in the mammalian brain has focused on rats and mice as animal models and used small laboratory environments as experimental setups—usually small boxes or short linear tracks ~1 to 2 m in size. In such small environments, individual place cells typically have one place field, with a small field size. However, outdoor navigation of all mammals occurs in natural environments that span much larger spatial scales, of hundreds of meters or kilometers, and nothing is known about the neural codes for such large spatial scales.

### RATIONALE

We reasoned that in very large environments, the hippocampus must exhibit a different coding scheme than seen in small environments because large environments cannot be tiled fully by the limited number of hippocampal neurons. We set out to discover this alternative coding scheme and thus to close the longstanding gap between the neurobiology of navigation as studied in the laboratory and natural large-scale navigation. To this end, we studied bats flying in a 200-m-long tunnel while we recorded the activity of hippocampal dorsal CA1 neurons using a custom wireless-electrophysiology system.

### RESULTS

We found that place cells recorded in the large environment exhibited a multifield, multiscale representation of space: Individual neurons exhibited multiple place fields of diverse sizes, ranging from <1 m to more than 30 m, and the fields of the same neuron could differ up to 20-fold in size. This multifield, multiscale code was observed already from the first day in the environment and was similar between wild-born and laboratory-born bats that were never exposed to large environments. By contrast, recordings from a small-scale 6-m environment did not reveal such a multiscale code but rather classical single fields. Theoretical decoding analysis showed major advantages of the multiscale code over classical single-field codes, both in the number of required neurons and in the decoding errors. Thus, the multiscale code provides an efficient population code with a high capacity for representing very large environments. We conducted neural-network modeling, which suggested that the multiscale code may arise from interacting attractor networks with multiple scales or from feedforward networks, which yielded experimentally testable predictions for the inputs into CA1.

### CONCLUSION

Using this experimental setup, our study uncovered a new coding scheme for large spaces, which was never observed before in small spaces: a multiscale code for space. This coding scheme existed from day 1 in the environment and was observed in both wild-born and laboratory-born bats, suggesting that it does not require previous experience. These findings provide a new notion for how the hippocampus represents space. The large naturalistic scale of our experimental environment was crucial for revealing this type of code. More generally, this study demonstrates the power of studying brain circuits under naturalistic conditions.

Figure: Multiscale hippocampal spatial code for very large environments. (Methods) We wirelessly recorded neural activity from hippocampal neurons of bats flying in a 200-m tunnel. (Findings) Single neurons exhibited multiple place fields with highly heterogeneous field sizes for the same neuron. (Function) This multiscale neural code for space strongly outperforms classical single-field place codes. (Modeling) Modeling by using interacting attractor networks and feedforward models recapitulated the multiscale coding.

Hippocampal place cells encode the animal’s location. Place cells were traditionally studied in small environments, and nothing is known about large ethologically relevant spatial scales. We wirelessly recorded from hippocampal dorsal CA1 neurons of wild-born bats flying in a long tunnel (200 meters). The size of place fields ranged from 0.6 to 32 meters. Individual place cells exhibited multiple fields and a multiscale representation: Place fields of the same neuron differed up to 20-fold in size. This multiscale coding was observed from the first day of exposure to the environment, and also in laboratory-born bats that never experienced large environments. Theoretical decoding analysis showed that the multiscale code allows representation of very large environments with much higher precision than that of other codes. Together, by increasing the spatial scale, we discovered a neural code that is radically different from classical place codes.
[1]: /lookup/doi/10.1126/science.abg4020
[2]: /lookup/doi/10.1126/science.abi9663
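The decoding comparison can be sketched with a toy simulation (not the paper's analysis; every parameter here is illustrative, and for simplicity each model cell gets a single Gaussian field, whereas the recorded neurons had multiple fields): maximum-likelihood position decoding from Poisson spike counts, with either one fixed field size per cell or field widths spanning the 0.6-32 m range reported in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
L, n_pos, n_cells = 200.0, 400, 20          # 200 m track, discretized positions
x = np.linspace(0.0, L, n_pos)

def tuning(centers, widths, peak=10.0):
    """Gaussian place fields: expected spike count of each cell at each position."""
    return peak * np.exp(-0.5 * ((x[None, :] - centers[:, None]) / widths[:, None]) ** 2)

def decode_error(rates, trials=100):
    """Mean absolute error of maximum-likelihood decoding from Poisson counts."""
    err = 0.0
    for _ in range(trials):
        true = rng.integers(n_pos)
        counts = rng.poisson(rates[:, true])
        # Poisson log-likelihood of these counts at every candidate position
        ll = (counts[:, None] * np.log(rates + 1e-9) - rates).sum(axis=0)
        err += abs(x[ll.argmax()] - x[true])
    return err / trials

# Classical code: every cell has a single field of one fixed size (5 m here).
single_scale = tuning(rng.uniform(0, L, n_cells), np.full(n_cells, 5.0))
# Multiscale code: field widths log-spaced from 0.5 m to 30 m.
multi_scale = tuning(rng.uniform(0, L, n_cells), np.geomspace(0.5, 30.0, n_cells))
```

With few cells, single fixed-size fields can leave much of the 200-m track uncovered, so the decoder can make large errors there, while a multiscale population constrains position coarsely everywhere and refines it with the narrow fields; comparing `decode_error(single_scale)` against `decode_error(multi_scale)` illustrates the trade-off the paper's decoding analysis quantifies.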
Convex Hierarchical Clustering for Graph-Structured Data
Convex clustering is a recent stable alternative to hierarchical clustering. It formulates the recovery of progressively coalescing clusters as a regularized convex problem. While convex clustering was originally designed for handling Euclidean distances between data points, in a growing number of applications the data is directly characterized by a similarity matrix or weighted graph. In this paper, we extend the robust hierarchical clustering approach to these broader classes of similarities. Having defined an appropriate convex objective, the crux of this adaptation lies in providing (a) an efficient recovery of the regularization path and (b) an empirical demonstration of the method's use. We address the first challenge with a proximal dual algorithm, for which we characterize both the theoretical efficiency and the empirical performance on a set of experiments. Finally, we highlight the potential of our method by applying it to several real-life datasets, providing a natural extension to the current scope of applications of convex clustering.
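The convex clustering objective the abstract builds on can be sketched in a few lines. The paper solves it with a proximal dual algorithm and extends it to similarity graphs; the sketch below instead runs plain gradient descent on a smoothed penalty, so it is illustrative only.

```python
import numpy as np

def convex_clustering(X, W, lam, steps=300, lr=0.01, eps=1e-3):
    """Gradient descent on a smoothed convex-clustering objective:
        0.5 * sum_i ||u_i - x_i||^2 + lam * sum_{i != j} W_ij ||u_i - u_j||,
    with each norm smoothed to sqrt(||.||^2 + eps) so it is differentiable.
    (Illustrative only; the paper uses a proximal dual algorithm.)"""
    U = X.copy()
    for _ in range(steps):
        diff = U[:, None, :] - U[None, :, :]            # diff[i, j] = u_i - u_j
        norm = np.sqrt((diff ** 2).sum(axis=-1) + eps)  # smoothed pairwise norms
        grad = (U - X) + lam * (W[..., None] * diff / norm[..., None]).sum(axis=1)
        U -= lr * grad
    return U

# Two tight pairs of points; the weight graph only connects points within a pair.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
W = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
U = convex_clustering(X, W, lam=1.0)
```

For large enough `lam`, the centroids `u_i` of connected points fuse; sweeping `lam` from 0 upward recovers the regularization path of progressively coalescing clusters that the abstract describes.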